
    Inferring Network Usage from Passive Measurements in ISP Networks: Bringing Visibility of the Network to Internet Operators

    The Internet evolves with us: people depend on it more every day, using it for many of the simplest activities of their lives. It is not uncommon to use the Internet for voice and video communication, social networking, banking, and shopping. Current trends in Internet applications such as Web 2.0, cloud computing, and the Internet of Things are bound to bring higher traffic volumes and more heterogeneous traffic. In addition, privacy concerns and network security threats have widely promoted the use of encryption in network communications. All these factors make network management an evolving environment that grows more difficult every day. This thesis focuses on keeping track of some of these changes, observing the Internet from an ISP viewpoint and exploring several aspects of network visibility, giving insight into what content or services customers retrieve and how that content is delivered to them. In general, this information is inferred through the characterization and analysis of data collected with passive traffic-monitoring tools on operational networks. Such analysis is challenging: Internet end-users are not controlled in the traffic they generate, and traffic on the network may be encrypted or encoded in ways that are unfeasible to decode, requiring reverse engineering to give the Internet operator a clear picture. Despite these challenges, the thesis first presents a characterization of P2P-TV usage for a commercial, proprietary, closed application that encrypts or encodes its traffic, making it quite difficult to discern what is going on just by observing the data carried by the protocol. It then presents DN-Hunter, an application that makes a large share of the network traffic visible even when encryption or encoding is in use.
Finally, it presents a case study of DN-Hunter applied to Amazon Web Services, the most prominent cloud provider, which offers computing, storage, and content-delivery platforms. The study unveils the AWS infrastructure, the pervasiveness of content, and its traffic-allocation policies. Findings reveal that most of the content residing on its cloud computing and Internet storage infrastructure is served by a single Amazon datacenter located in Virginia, even though it appears to be the worst-performing one for Italian users. This causes traffic to take long and expensive paths through the network. Since AWS offers no automatic migration or load-balancing policies across its different locations, content is exposed to outages, as observed in the presented datasets.

    Discerning web content and services based on real-time DNS tagging

    A method for profiling the network traffic of a network: obtain a plurality of flows comprising client IP addresses, server IP addresses, and server ports; extract a plurality of fully qualified domain names (FQDNs) from DNS flows in the network traffic; analyze the correlation between the flows and the FQDNs to generate a result; and present the result to an administrator user of the network.
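    The core idea of the method above — tagging traffic flows with the FQDN a client resolved for the server's IP address — can be sketched as follows. This is a minimal illustration, not the patented method or DN-Hunter's implementation; the data structures, the "most recent resolution wins" matching rule, and all names are assumptions.

    ```python
    # Hypothetical sketch of DNS-based flow tagging: correlate observed flows
    # with the FQDNs that clients resolved via DNS. All names are illustrative.
    from collections import namedtuple

    DnsRecord = namedtuple("DnsRecord", ["client_ip", "server_ip", "fqdn"])
    Flow = namedtuple("Flow", ["client_ip", "server_ip", "server_port"])

    def build_dns_index(dns_flows):
        """Map (client_ip, server_ip) -> the FQDN that client most recently
        resolved to that server IP (records assumed in chronological order)."""
        index = {}
        for rec in dns_flows:
            index[(rec.client_ip, rec.server_ip)] = rec.fqdn
        return index

    def tag_flows(flows, dns_flows):
        """Label each flow with the FQDN its client resolved for the server IP,
        or 'unknown' when no matching DNS resolution was observed."""
        index = build_dns_index(dns_flows)
        return [(f, index.get((f.client_ip, f.server_ip), "unknown"))
                for f in flows]
    ```

    With this per-client index, two clients reaching the same server IP through different hostnames get different tags — which is what makes the approach useful even when the payload itself is encrypted.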
